Cumulative Measure of Inaccuracy and Mutual Information in k-th Lower Record Values


Similar Articles

More Results on Dynamic Cumulative Inaccuracy Measure

In this paper, borrowing the intuition of Rao et al. (2004), we introduce a cumulative version of the inaccuracy measure (CIM). We also obtain useful properties of the CIM in several cases based on the residual, past, and interval lifetime random variables. Relying on various applications of stochastic classes in reliability and information theory, we stu...
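
For context, a minimal sketch of the quantities this abstract builds on, assuming the standard cumulative residual entropy of Rao et al. (2004) and its Kerridge-type inaccuracy analogue; the exact definition of the CIM used in the paper may differ in detail:

    $\mathcal{E}(X) = -\int_{0}^{\infty} \bar{F}(x)\,\log \bar{F}(x)\, dx$

    $K(F,G) = -\int_{0}^{\infty} \bar{F}(x)\,\log \bar{G}(x)\, dx$

where $\bar{F}(x) = P(X > x)$ is the true survival function and $\bar{G}$ is the assumed (reference) survival function; $K(F,F) = \mathcal{E}(X)$, so such a CIM extends the cumulative residual entropy in the same way Kerridge's inaccuracy extends Shannon entropy.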


Some Characterization and Relations Based on K-th Lower Record Values

In this paper, we obtain certain expressions and recurrence relations for two general classes of distributions based on conditional expectations of k-th lower record values. We give necessary and sufficient conditions under which these conditional expectations hold for some distribution functions. Furthermore, an expression for the conditional expectation of another general class ...
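
As a reference point (standard background, not taken from the abstract above), writing $L_n^{(k)}$ for the n-th k-th lower record value from an absolutely continuous distribution with cdf $F$ and pdf $f$, its density is usually written as

    $f_{L_n^{(k)}}(x) = \frac{k^{n}}{(n-1)!}\,\bigl[-\ln F(x)\bigr]^{n-1}\,\bigl[F(x)\bigr]^{k-1} f(x), \qquad n \ge 1,\ k \ge 1$

so that $k = 1$ recovers the ordinary lower record values; the conditional expectations mentioned above are taken with respect to record values of this type.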


Non-Additive Entropy Measure and Record Values

Non-additive entropy measures are important in many applications. We study the Havrda and Charvat entropy of record values and show that it characterizes the underlying distribution function uniquely. The non-additive entropy of record values is also derived for some specific distributions. Further, we propose a generalized residual entropy measure for record values.
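
For reference, the Havrda and Charvat (Tsallis-type) entropy of a random variable $X$ with density $f$ is commonly defined, up to a normalizing convention, as

    $H_{\alpha}(X) = \frac{1}{1-\alpha}\left(\int_{-\infty}^{\infty} f^{\alpha}(x)\, dx - 1\right), \qquad \alpha > 0,\ \alpha \neq 1$

which tends to the Shannon entropy as $\alpha \to 1$ and is non-additive: for independent $X$ and $Y$, $H_{\alpha}(X,Y) = H_{\alpha}(X) + H_{\alpha}(Y) + (1-\alpha)\,H_{\alpha}(X)\,H_{\alpha}(Y)$.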


Quantile Approach of Generalized Cumulative Residual Information Measure of Order $(\alpha,\beta)$

In this paper, we introduce the concept of quantile-based generalized cumulative residual entropy of order $(\alpha,\beta)$ for residual and past lifetimes and study its properties. Further, we study the proposed information measure for series and parallel systems when the random variables are untruncated or truncated, and some characterization results are presented. At the end, we study gene...
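
As background (a sketch under the usual conventions, not the paper's exact two-parameter definition), the quantile-based cumulative residual entropy is obtained from the survival-function form by the substitution $x = Q(u)$:

    $\mathcal{E}(X) = -\int_{0}^{\infty} \bar{F}(x)\,\log \bar{F}(x)\, dx = -\int_{0}^{1} (1-u)\,\log(1-u)\, q(u)\, du$

where $Q(u) = F^{-1}(u)$ is the quantile function and $q(u) = Q'(u)$ is the quantile density; the order-$(\alpha,\beta)$ measure in the abstract is a two-parameter generalization of this quantity.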


Lower bounds on mutual information.

We correct claims about lower bounds on mutual information (MI) between real-valued random variables made by Kraskov et al., Phys. Rev. E 69, 066138 (2004). We show that non-trivial lower bounds on MI in terms of linear correlations depend on the marginal (single variable) distributions. This is so in spite of the invariance of MI under reparametrizations, because linear correlations are not in...
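
To make the correction concrete, recall the definition of MI and the bivariate Gaussian case on which the disputed bound is modeled (a sketch; the precise statements are in the cited papers):

    $I(X;Y) = \iint p(x,y)\,\log \frac{p(x,y)}{p(x)\,p(y)}\, dx\, dy, \qquad I_{\mathrm{Gauss}}(X;Y) = -\tfrac{1}{2}\log\bigl(1-\rho^{2}\bigr)$

where $\rho$ is the correlation coefficient of a bivariate Gaussian. A bound of the form $I(X;Y) \ge -\tfrac{1}{2}\log(1-r^{2})$, with $r$ the linear (Pearson) correlation, cannot hold for arbitrary marginals: MI is invariant under monotone reparametrizations of each variable while $r$ is not, so the attainable $r$ for a given MI depends on the marginal distributions.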



Journal

Journal title: Mathematics

Year: 2019

ISSN: 2227-7390

DOI: 10.3390/math7020175